Cross-lingual AMR Aligner: Paying Attention to Cross-Attention
This paper introduces a novel aligner for Abstract Meaning Representation
(AMR) graphs that can scale cross-lingually, and is thus capable of aligning
units and spans in sentences of different languages. Our approach leverages
modern Transformer-based parsers, which inherently encode alignment information
in their cross-attention weights, allowing us to extract this information
during parsing. This eliminates the need for English-specific rules or the
Expectation Maximization (EM) algorithm that have been used in previous
approaches. In addition, we propose a guided supervised method using alignment
to further enhance the performance of our aligner. We achieve state-of-the-art
results in the benchmarks for AMR alignment and demonstrate our aligner's
ability to obtain them across multiple languages. Our code will be available at
\href{https://www.github.com/Babelscape/AMR-alignment}{github.com/Babelscape/AMR-alignment}.
Comment: ACL 2023. Please cite the authors using both last names ("Mart\'inez Lorenzo", "Huguet Cabot").
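As a rough illustration of the idea, the snippet below sketches how word-to-node alignments might be read off a cross-attention matrix by taking, for each graph-node token, the most-attended source word. The function name, index arguments, and toy matrix are all hypothetical; the paper's actual method extracts this information from a Transformer AMR parser's attention heads during parsing.

```python
import numpy as np

def align_from_cross_attention(cross_attn, node_rows, word_cols):
    """Map each AMR-node token to the source word it attends to most.

    cross_attn : (num_target_tokens, num_source_tokens) array of
                 cross-attention weights, e.g. averaged over heads.
    node_rows  : target-token indices that realize graph nodes.
    word_cols  : source-token indices that start words.
    """
    # Restrict to node rows and word columns, then take the argmax
    # over source words for each node token.
    sub = cross_attn[np.ix_(node_rows, word_cols)]
    best = sub.argmax(axis=1)
    return {node: word_cols[j] for node, j in zip(node_rows, best)}

# Toy 3x4 attention matrix: rows 0 and 2 are node tokens.
attn = np.array([
    [0.1, 0.7, 0.1, 0.1],   # node token 0 attends mostly to source 1
    [0.2, 0.1, 0.1, 0.6],   # not a node row
    [0.6, 0.2, 0.1, 0.1],   # node token 2 attends mostly to source 0
])
print(align_from_cross_attention(attn, node_rows=[0, 2], word_cols=[0, 1, 3]))
# {0: 1, 2: 0}
```

Because the weights come for free from the parser's forward pass, no EM procedure or language-specific rules are needed, which is what lets the approach scale cross-lingually.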
Incorporating Graph Information in Transformer-based AMR Parsing
Abstract Meaning Representation (AMR) is a Semantic Parsing formalism that
aims at providing a semantic graph abstraction representing a given text.
Current approaches are based on autoregressive language models such as BART or
T5, fine-tuned through Teacher Forcing to obtain a linearized version of the
AMR graph from a sentence. In this paper, we present LeakDistill, a model and
method that explores a modification to the Transformer architecture, using
structural adapters to explicitly incorporate graph information into the
learned representations and improve AMR parsing performance. Our experiments
show how, by employing word-to-node alignment to embed graph structural
information into the encoder at training time, we can obtain state-of-the-art
AMR parsing through self-knowledge distillation, even without the use of
additional data. We release the code at
\url{http://www.github.com/sapienzanlp/LeakDistill}.
Comment: ACL 2023. Please cite the authors using both last names ("Mart\'inez Lorenzo", "Huguet Cabot").
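As a hedged sketch of the self-knowledge-distillation idea (not the paper's actual objective), the snippet below mixes a cross-entropy term on gold graph tokens with a KL term pulling the plain "student" pass towards the graph-leaked "teacher" pass of the same model. The function names, temperature, and mixing weight are illustrative assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def self_distill_loss(student_logits, teacher_logits, gold, T=2.0, alpha=0.5):
    # Cross-entropy against gold graph tokens (the Teacher Forcing step).
    ce = -np.log(softmax(student_logits)[np.arange(len(gold)), gold]).mean()
    # KL divergence pulling the plain pass towards the graph-leaked pass.
    p_s = softmax(student_logits, T)
    p_t = softmax(teacher_logits, T)
    kl = np.sum(p_t * (np.log(p_t) - np.log(p_s)), axis=-1).mean()
    return alpha * ce + (1.0 - alpha) * (T * T) * kl

logits = np.array([[2.0, 0.5, 0.1]])
gold = np.array([0])
# With identical passes the KL term vanishes, leaving only the CE part.
print(self_distill_loss(logits, logits, gold))
```

The key point is that the teacher pass sees word-to-node alignment information "leaked" into the encoder only at training time, so at inference the student needs no additional input.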
Healthcare workers hospitalized due to COVID-19 have no higher risk of death than general population. Data from the Spanish SEMI-COVID-19 Registry
Aim: To determine whether healthcare workers (HCW) hospitalized in Spain due to COVID-19 have a worse prognosis than non-healthcare workers (NHCW).
Methods: Observational cohort study based on the SEMI-COVID-19 Registry, a nationwide registry that collects sociodemographic, clinical, laboratory, and treatment data on patients hospitalized with COVID-19 in Spain. Patients aged 20-65 years were selected. A multivariate logistic regression model was fitted to identify factors associated with mortality.
Results: As of 22 May 2020, 4393 patients were included, of whom 419 (9.5%) were HCW. The median (interquartile range) age of HCW was 52 (15) years and 62.4% were women. Comorbidities and severe radiological findings upon admission were less frequent in HCW. There was no difference in the need for respiratory support or admission to the intensive care unit, but the occurrence of sepsis and in-hospital mortality were lower in HCW (1.7% vs. 3.9%, p = 0.024, and 0.7% vs. 4.8%, p < 0.001, respectively). Age, male sex, and comorbidity were independently associated with higher in-hospital mortality, and healthcare work with lower mortality (OR 0.211, 95% CI 0.067-0.667, p = 0.008). 30-day survival was higher in HCW (0.968 vs. 0.851, p < 0.001).
Conclusions: Hospitalized COVID-19 HCW had fewer comorbidities and a better prognosis than NHCW. Our results suggest that professional exposure to COVID-19 in HCW does not carry greater clinical severity or mortality.
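For illustration only, the crude (unadjusted) odds ratio implied by the reported mortality percentages can be recomputed from approximate counts, as sketched below. Note that the abstract's OR of 0.211 comes from the multivariate model adjusted for age, sex, and comorbidity, so the crude value differs; the counts here are back-calculated from the reported percentages, not taken from the registry.

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio with a Woolf 95% CI.
    a/b: events/non-events in group 1; c/d: events/non-events in group 2."""
    or_ = (a / b) / (c / d)
    # Woolf's method: standard error of log(OR) from the 2x2 cell counts.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Back-calculated counts: 0.7% of 419 HCW vs. 4.8% of 3974 NHCW died.
hcw_deaths = round(0.007 * 419)       # ~3
nhcw_deaths = round(0.048 * 3974)     # ~191
print(odds_ratio(hcw_deaths, 419 - hcw_deaths,
                 nhcw_deaths, 3974 - nhcw_deaths))
```

The crude OR comes out well below 1, in the same direction as the adjusted estimate, consistent with the abstract's conclusion of lower in-hospital mortality among HCW.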
MAGIC and H.E.S.S. detect VHE gamma rays from the blazar OT081 for the first time: a deep multiwavelength study
https://pos.sissa.it/395/815/pdf (published version)
BabelNet Meaning Representation: A Fully Semantic Formalism to Overcome Language Barriers
Conceptual representations of meaning have long been the general focus of Artificial Intelligence (AI) towards the fundamental goal of machine understanding, with innumerable efforts made in Knowledge Representation, Speech and Natural Language Processing, Computer Vision, inter alia. Even today, at the core of Natural Language Understanding lies the task of Semantic Parsing, the objective of which is to convert natural sentences into machine-readable representations. Through this paper, we aim to revamp the historical dream of AI, by putting forward a novel, all-embracing, fully semantic meaning representation that goes beyond the many existing formalisms. Indeed, we tackle their key limits by fully abstracting text into meaning and introducing language-independent concepts and semantic relations, in order to obtain an interlingual representation. Our proposal aims to overcome the language barrier, and connect not only texts across languages, but also images, videos, speech and sound, and logical formulas, across many fields of AI.
Multisectoral assembly of 1974: national broadcast of the multisectoral meeting convened by President María Estela Martínez de Perón in response to subversive violence
On 8 October, a Multisectoral Meeting was held at the Casa de Gobierno, convened by President María Estela Martínez de Perón in response to the violence expressed in attacks and "subversive acts". What is heard is the "condensed version" broadcast over the national radio and television network. Representatives of the political parties and trade union organizations attended. All of them, addressing the president, repudiate the violence, reaffirm their support for the institutions, and explicitly mention the unrest in the Armed Forces that would culminate in the coup d'état of 24 March 1976.
The president opens and closes the meeting, thanking those in attendance and committing to continue the dialogue. In her remarks she becomes emotional, refers several times to the late President Juan Domingo Perón, and speaks of his succession.
In this segment of the national broadcast, the following are heard:
-Américo Ghioldi for the Partido Socialista Democrático
-Arturo Ponsati for the Partido Revolucionario Cristiano
-Víctor García Costa for the Partido Socialista Popular
-Juan Carlos Coral for the Partido Socialista de los Trabajadores
-Carmelo Vinti for Unión Popular
-Ricardo Balbín for the Unión Cívica Radical
-Lorenzo Miguel for the 62 Organizaciones
-President María Estela Martínez de Perón and the close of the national radio and television broadcast. Radio Universidad Nacional de La Plata
Intensity interferometry with the MAGIC telescopes
Due to their large mirror size, fast response to single photons, sensitivity, and telescope baselines
on the order of 100 m, Imaging Atmospheric Cherenkov Telescopes are ideally suited to perform
intensity interferometry observations. In 2019 a test readout setup was installed in the two 17-m
diameter MAGIC telescopes to allow performing interferometry measurements with them. The
first on-sky measurements were able to detect correlated intensity fluctuations consistent with the
stellar diameters of three different stars: Adhara (ε CMa), Benetnasch (η UMa) and Mirzam (β
CMa). After the upgrade of the setup in 2021, MAGIC is now equipped with a high-duty-cycle
intensity interferometer, already in operation. A technical description of the interferometer and
first performance results obtained by measuring several known stellar diameters are presented.
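The principle behind such measurements can be sketched numerically: for a uniform stellar disk, the squared visibility seen between two telescopes falls off with baseline following an Airy pattern, which is how correlated intensity fluctuations (the Hanbury Brown-Twiss effect) encode the angular diameter. The function names, the 0.7 mas diameter, and the 400 nm wavelength below are illustrative assumptions, not values from the proceedings.

```python
import numpy as np

def j1_numeric(x, n=4001):
    # Bessel J1 via its integral representation, so the sketch needs
    # only NumPy: J1(x) = (1/pi) * int_0^pi cos(th - x sin th) dth.
    th = np.linspace(0.0, np.pi, n)
    vals = np.cos(th - x * np.sin(th))
    dth = np.pi / (n - 1)
    return (vals.sum() - 0.5 * (vals[0] + vals[-1])) * dth / np.pi

def squared_visibility(baseline_m, theta_rad, wavelength_m):
    # |V|^2 of a uniform disk of angular diameter theta_rad: the
    # quantity an intensity interferometer recovers from correlated
    # intensity fluctuations.
    x = np.pi * baseline_m * theta_rad / wavelength_m
    if x < 1e-8:
        return 1.0          # unresolved limit
    return (2.0 * j1_numeric(x) / x) ** 2

MAS = np.pi / (180 * 3600 * 1000)   # one milliarcsecond in radians
theta = 0.7 * MAS                   # illustrative angular diameter
for b in (0.0, 50.0, 100.0):        # baselines of order 100 m
    print(b, squared_visibility(b, theta, 400e-9))
```

Fitting the measured falloff of the correlation signal versus baseline to this curve yields the stellar diameter, which is what the MAGIC setup does for the stars listed above.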